13 research outputs found

    Multimodal Imaging of Silver Nanoclusters

    Recent developments in nanobiotechnology have given rise to a novel class of fluorescent labels: fluorescent metal nanoclusters, e.g., gold and silver nanoclusters. Because high-atomic-number elements such as silver attenuate X-rays strongly, silver nanoclusters can also serve as labels in X-ray microtomography. Features such as ultra-small size, good biocompatibility, non-toxicity and photostability make nanoclusters more attractive fluorescent labels than conventional fluorophore dyes in biological imaging. The core concept of this thesis is to analyze silver nanoclusters (AgNCs) as a contrast agent with two imaging modalities, X-ray microtomography (microCT) and optical projection tomography (OPT), and to estimate how the X-ray absorption and fluorescent signal relate to the concentration of silver nanoclusters in the sample. Two sample types were studied: AgNC-agar phantoms with different AgNC concentrations, diluted with agar and water, and filter paper coated with silver nanoclusters using different dipping times. The imaging was carried out in three parts: (1) microCT imaging of both the AgNC-agar and filter paper samples; (2) optical imaging of the AgNC-agar samples in both fluorescent and bright-field modes; (3) microCT imaging of the samples that had first been imaged with OPT. A quantitative analysis was then applied to both the microCT and optical images to relate the X-ray and light absorption of the samples to the AgNC concentration. As expected, a higher AgNC concentration yielded brighter microCT images due to stronger X-ray absorption. In sum, our results show that the tested silver nanoclusters can be used as a label in both X-ray microtomography and fluorescent OPT, since they produce contrast in both X-ray and optical images. Moreover, the measured data demonstrate a linear correlation between the image intensities of both modalities and the amount of silver material.
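    The quantitative step above amounts to a linear regression of image intensity against AgNC concentration. A minimal sketch with made-up numbers (not the thesis data) could look like this:

```python
import numpy as np

def fit_intensity_vs_concentration(conc, intensity):
    """Least-squares line intensity = a * conc + b and Pearson r."""
    a, b = np.polyfit(conc, intensity, 1)
    r = np.corrcoef(conc, intensity)[0, 1]
    return a, b, r

# Synthetic, noiseless mock data: intensity rises linearly with concentration.
conc = np.array([0.0, 0.5, 1.0, 2.0, 4.0])   # arbitrary concentration units
intensity = 10.0 + 3.0 * conc                # arbitrary intensity units
a, b, r = fit_intensity_vs_concentration(conc, intensity)
# For these data the fit recovers slope 3, intercept 10, and r = 1.
```

    In practice the same fit, applied separately to the microCT and OPT intensities, quantifies how well each modality tracks the silver content.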

    Randomized Multiresolution Scanning in Focal and Fast E/MEG Sensing of Brain Activity with a Variable Depth

    We focus on electromagnetoencephalography imaging of neural activity and, in particular, on finding a robust estimate for the primary current distribution via the hierarchical Bayesian model (HBM). Our aim is to develop a reasonably fast maximum a posteriori (MAP) estimation technique that is applicable to both superficial and deep areas without specific a priori knowledge of the number or location of the activity. To enable source distinguishability at any depth, we introduce a randomized multiresolution scanning (RAMUS) approach in which the MAP estimate of the brain activity is varied during the reconstruction process. RAMUS aims to provide a robust and accurate imaging outcome for the whole brain while maintaining the computational cost at an appropriate level. The inverse gamma (IG) distribution is applied as the primary hyperprior in order to achieve optimal performance for the deep part of the brain. In this proof-of-concept study, we consider the detection of simultaneous thalamic and somatosensory activity via numerically simulated data modeling the 14-20 ms post-stimulus somatosensory evoked potential and field response to electrical wrist stimulation. Both a spherical and a realistic model are utilized to analyze the source reconstruction discrepancies. In the numerically examined case, RAMUS was observed to enhance the visibility of deep components and to marginalize the random effects of the discretization and optimization without a remarkable computational cost. A robust and accurate MAP estimate for the primary current density was obtained in both superficial and deep parts of the brain. Comment: Brain Topogr (2020).
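    To make the HBM/MAP machinery concrete, here is a minimal alternating MAP iteration for y = Lx + noise with a conditionally Gaussian prior x_i | gamma_i ~ N(0, gamma_i) and an inverse gamma hyperprior on the variances. The dense solver, parameter names, and default values are illustrative only; this is a toy sketch, not the RAMUS implementation:

```python
import numpy as np

def ias_map(L, y, alpha=2.0, theta=1e-4, sigma=0.05, n_iter=30):
    """Alternating MAP sketch: the x-step is a Tikhonov-type solve with
    the current prior variances; the gamma-step is the closed-form
    minimizer for an inverse gamma IG(alpha, theta) hyperprior."""
    n = L.shape[1]
    gamma = np.full(n, theta)          # initial prior variances
    x = np.zeros(n)
    for _ in range(n_iter):
        D = np.diag(1.0 / gamma)       # precision from current variances
        x = np.linalg.solve(L.T @ L / sigma**2 + D, L.T @ y / sigma**2)
        gamma = (x**2 / 2 + theta) / (alpha + 1.5)
    return x
```

    On a trivial identity lead field this shrinks near-zero components toward zero while keeping strong components close to the data, which is the focality-inducing behavior the hyperprior is chosen for.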

    Zeffiro user interface for electromagnetic brain imaging: a GPU accelerated FEM tool for forward and inverse computations in Matlab

    This article introduces the Zeffiro interface (ZI) version 2.2 for brain imaging. ZI aims to provide a simple, accessible and multimodal open source platform for finite element method (FEM) based and graphics processing unit (GPU) accelerated forward and inverse computations in the Matlab environment. It allows one to (1) generate a given multi-compartment head model, (2) evaluate a lead field (LF) matrix, and (3) invert and analyze a given set of measurements. GPU acceleration is applied in each of the processing stages (1)-(3). In its current configuration, ZI includes forward solvers for electro-/magnetoencephalography (EEG/MEG) and linearized electrical impedance tomography (EIT) as well as a set of inverse solvers based on the hierarchical Bayesian model (HBM). We report the results of EEG and EIT inversion tests performed with real and synthetic data, respectively, and demonstrate numerically how the inversion parameters affect the EEG inversion outcome in HBM. The GPU acceleration was found to be essential in the generation of the FE mesh and the LF matrix in order to achieve a reasonable computing time. The code package can be extended in the future based on the directions given in this article.
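    As an illustration of how a lead field matrix enters an inverse computation, the classical Tikhonov-regularized minimum norm estimate (the simplest inversion a lead field supports, not ZI's HBM solvers) has a closed form:

```python
import numpy as np

def minimum_norm_estimate(L, y, lam=0.1):
    """x = argmin ||y - L x||^2 + lam * ||x||^2, computed in sensor
    space as x = L^T (L L^T + lam I)^{-1} y; L is the lead field,
    y the measurement vector, lam the regularization parameter."""
    m = L.shape[0]
    return L.T @ np.linalg.solve(L @ L.T + lam * np.eye(m), y)
```

    Working in sensor space keeps the solve at the size of the (small) sensor count rather than the (large) source count, which matters when the lead field has thousands of source columns.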

    Conditionally Exponential Prior in Focal Near- and Far-Field EEG Source Localization via Randomized Multiresolution Scanning (RAMUS)

    In this paper, we focus on the inverse problem of reconstructing distributional brain activity with cortical and weakly detectable deep components in non-invasive electroencephalography. We consider a recently introduced hybrid reconstruction strategy combining a hierarchical Bayesian model, to incorporate a priori information, with the advanced randomized multiresolution scanning (RAMUS) source space decomposition approach, to reduce modelling errors. In particular, we aim to generalize the previously extensively used conditionally Gaussian prior (CGP) formalism to achieve distributional reconstructions with higher focality. For this purpose, we introduce as a hierarchical prior a general exponential distribution, which we refer to as the conditionally exponential prior (CEP). The first-degree CEP corresponds to the focality-enforcing Laplace prior, but it also suffers from a strong depth bias when applied in numerical modelling, making deep activity unrecoverable. We sample over multiple resolution levels via RAMUS to reduce this bias, as it is known to depend on the resolution of the source space. Moreover, we introduce a procedure based on physiological a priori knowledge of brain activity to obtain the shape and scale parameters of the gamma hyperprior that steers the CEP. The posterior estimates are calculated using iterative statistical methods, expectation maximization and the iterative alternating sequential algorithm, which we show to be algorithmically similar and to have a close resemblance to the iterative ℓ1 and ℓ2 reweighting methods. The performance of the CEP is compared with the recent sampling-based dipole localization method Sequential Semi-Analytic Monte Carlo Estimation (SESAME) in numerical experiments of simulated somatosensory evoked potentials related to human median nerve stimulation. Our results obtained using synthetic sources suggest that a hybrid of the first-degree CEP and RAMUS can achieve an accuracy comparable to the second-degree case (CGP) while being more focal. Further, the proposed hybrid is shown to be robust to noise effects and compares well with the dipole reconstructions obtained with SESAME.
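    The close resemblance to iterative ℓ1 reweighting mentioned above can be sketched with a generic iteratively reweighted least squares (IRLS) loop for an ℓ1-penalized problem, using the textbook surrogate |x_i| ~ x_i^2 / (2 w_i) with w_i = |x_i| from the previous iterate; this is a standard illustration, not the paper's CEP/IAS code:

```python
import numpy as np

def irls_l1(A, b, lam=0.1, n_iter=50, eps=1e-8):
    """Approximate argmin ||A x - b||^2 + lam * ||x||_1 by solving a
    reweighted ridge problem at each iteration; eps guards against
    division by zero for components sitting at zero."""
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        w = np.abs(x) + eps
        # weighted ridge solve: (A^T A + lam/2 * diag(1/w)) x = A^T b
        x = np.linalg.solve(A.T @ A + (lam / 2) * np.diag(1.0 / w), A.T @ b)
    return x
```

    For an identity A this loop reproduces soft thresholding: components smaller than lam/2 in magnitude are driven to zero, which is the sparsifying behavior that makes the first-degree (Laplace-type) prior focal.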

    L1-norm vs. L2-norm fitting in optimizing focal multi-channel tES stimulation : linear and semidefinite programming vs. weighted least squares

    Background and Objective: This study focuses on multi-channel transcranial electrical stimulation, a non-invasive brain stimulation method that modulates neuronal activity with low-intensity currents. We introduce a mathematical formulation for finding a current pattern that optimizes an L1-norm fit between a given focal target distribution and the volumetric current density inside the brain. The L1-norm is well known to favor well-localized or sparse distributions compared to L2-norm (least-squares) fitted estimates. Methods: We present a linear programming approach that performs L1-norm fitting and penalization of the current pattern (L1L1) to control the number of non-zero currents. The optimizer selects the current pattern via a two-stage metaheuristic search over a pre-filtered set of candidate solutions. Results: The numerical simulation results obtained with both 8- and 20-channel electrode montages suggest that our hypothesis on the benefits of L1-norm data fitting is valid. Compared to an L1-norm regularized L2-norm fitting (L1L2) via semidefinite programming and a weighted Tikhonov least-squares method (TLS), the L1L1 results were overall preferable for maximizing the focused current density at the target position and the ratio between the focused and nuisance current magnitudes. Conclusions: We propose the metaheuristic L1L1 optimization approach as a potential technique to obtain a well-localized stimulus with a controllable magnitude at a given target position. L1L1 finds a current pattern with a steep contrast between the anodal and cathodal electrodes while suppressing the nuisance currents in the brain, hence providing a potential alternative for modulating the effects of the stimulation, e.g., the sensation experienced by the subject.
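    The L1-norm data fit itself can be recast as a linear program, which is the standard reformulation behind such approaches: minimize the sum of slack variables t bounding |A x - b| componentwise. A minimal sketch of the data-fit part only (without the paper's current-pattern penalty or metaheuristic search; SciPy's generic LP solver stands in for the actual optimizer):

```python
import numpy as np
from scipy.optimize import linprog

def l1_fit(A, b):
    """Solve min ||A x - b||_1 as an LP over variables (x, t):
    minimize sum(t) subject to -t <= A x - b <= t."""
    m, n = A.shape
    c = np.concatenate([np.zeros(n), np.ones(m)])   # cost only on the slacks t
    A_ub = np.block([[A, -np.eye(m)], [-A, -np.eye(m)]])
    b_ub = np.concatenate([b, -b])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(None, None)] * (n + m))
    return res.x[:n]
```

    For a single unknown, the L1 fit returns the median of the data, a small demonstration of the outlier robustness and sparsity-favoring behavior that motivates L1L1 over least squares.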

    Forward and Inverse Modeling via Finite Elements in EEG/MEG Source Localization : Application to Event Related Responses

    This thesis aims at advancing the development of forward and inverse modeling techniques to solve the electromagnetic inverse problems arising in electro- and magnetoencephalography (EEG and MEG) of the human brain. A finite element method (FEM)-based and divergence conforming H(div) forward modeling approach is applied to obtain the electric and magnetic field of the neural activity in a thin, heavily folded and multicompartment head model. This accurate H(div) approach enables inversion techniques to localize the primary current distribution of the brain robustly. Furthermore, this thesis introduces the Zeffiro Interface (ZI) code package which provides a platform for integrating forward and inverse solvers for a realistic head model. ZI uses graphics processing unit (GPU) acceleration and can, therefore, flexibly utilize finite element (FE) models with a high 1 mm accuracy. Herein, ZI is applied in method development and experimental studies. In this thesis, a source localization approach is built upon conditionally Gaussian hierarchical Bayesian modeling (HBM), the iterative alternating sequential (IAS) reconstruction technique, a variable resolution of the source space, and random sampling. These different aspects are combined in the randomized multiresolution scanning (RAMUS) method, which is introduced as a strategy to marginalize the effect of discretization and optimization errors and, thereby, minimize the depth bias of the reconstructed activity. A prior-over-measurement signal-to-noise ratio (PM-SNR) is introduced as a way to choose hyperprior parameters for a given mesh resolution and noise level. The proposed methods are investigated using simulated and experimental somatosensory evoked potentials and fields (SEPs and SEFs). RAMUS was found to be a promising technique to distinguish the subcortical activity of the brain, which might occur simultaneously with cortical components. The non-invasive detection of subcortical activity is a scientifically important and timely topic which can have remarkable implications for the treatment of Alzheimer's or Parkinson's disease and, in particular, for localizing refractory epilepsy.

    Parametrizing the Conditionally Gaussian Prior Model for Source Localization with Reference to the P20/N20 Component of Median Nerve SEP/SEF

    In this article, we focused on developing the conditionally Gaussian hierarchical Bayesian model (CG-HBM), which forms a superclass of several inversion methods for source localization of brain activity using somatosensory evoked potential (SEP) and field (SEF) measurements. The goal of this proof-of-concept study was to improve the applicability of the CG-HBM as a superclass by proposing a robust approach for the parametrization of focal source scenarios. We aimed at a parametrization that is invariant with respect to altering the noise level and the source space size. The posterior difference between the gamma and inverse gamma hyperprior was minimized by optimizing the shape parameter, while a suitable range for the scale parameter can be obtained via the prior-over-measurement signal-to-noise ratio, which we introduce as a new concept in this study. In the source localization experiments, the primary generator of the P20/N20 component was detected in the Brodmann area 3b using the CG-HBM approach and a parameter range derived from the existing knowledge of the Tikhonov-regularized minimum norm estimate, i.e., the classical Gaussian prior model. Moreover, it seems that the detection of deep thalamic activity simultaneously with the P20/N20 component with the gamma hyperprior can be enhanced while using a close-to-optimal shape parameter value.
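    The gamma versus inverse gamma comparison can be made concrete through the closed-form variance updates of the conditionally Gaussian model x | gamma ~ N(0, gamma): minimizing the conditional MAP objective over gamma yields a different expression for each hyperprior. The formulas below follow the IAS literature and are a generic sketch, not this paper's parametrization:

```python
import numpy as np

def gamma_step_ig(x, alpha, theta):
    """Variance update under an inverse gamma hyperprior IG(alpha, theta):
    minimizer of x^2/(2 g) + (alpha + 3/2) * log(g) + theta / g."""
    return (x**2 / 2 + theta) / (alpha + 1.5)

def gamma_step_g(x, beta, theta):
    """Variance update under a gamma hyperprior Gamma(beta, theta):
    positive root of g^2 - eta*theta*g - theta*x^2/2 = 0, eta = beta - 3/2."""
    eta = beta - 1.5
    return theta * (eta / 2 + np.sqrt(eta**2 / 4 + x**2 / (2 * theta)))
```

    The shape parameter (alpha or beta) controls how aggressively small source amplitudes are shrunk, which is why tuning it changes the balance between focality and the recoverability of weak deep activity.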

    Autophagy orchestrates resistance in hepatocellular carcinoma cells

    Treatment resistance is one of the major barriers to therapeutic strategies in hepatocellular carcinoma (HCC). Many studies have indicated that chemotherapy and radiotherapy induce the autophagy machinery (cytoprotective autophagy) in HCC cells. In addition, many experiments report a remarkable crosstalk between treatment resistance and autophagy pathways. Thus, autophagy could be one of the key factors enabling tumor cells to evade induced cell death after medical interventions. Therefore, extensive research on the molecular pathways involved in resistance induction and autophagy has been conducted to achieve the desired therapeutic response. The key molecular pathways related to therapy resistance are TGF-β, MAPK, NRF2, NF-κB, and non-coding RNAs. In addition, EMT, drug transporters, apoptosis evasion, DNA repair, cancer stem cells, and hypoxia can have a considerable impact on the hepatoma cell's response to therapies. These mechanisms protect tumor cells against various treatments, and many studies have shown that each of them is connected to the molecular pathways of autophagy induction in HCC. Hence, autophagy inhibition may be an effective strategy to improve therapeutic outcomes in HCC patients. In this review, we highlight how autophagy leads to a poor response during treatment through a complex molecular network and how it enhances resistance in primary liver cancer. We propose that combining approved HCC therapeutic protocols with autophagy inhibitors may overcome drug resistance in HCC therapy.

    Highly Adaptive and Automated Tetrahedral Mesh Generator for Multi-Compartment Human Head Model with Deep Brain Structures in EEG

    This paper introduces a highly adaptive and automated approach for generating a finite element (FE) discretization for a given realistic multi-compartment human head model obtained from a magnetic resonance imaging (MRI) dataset. We aim at obtaining accurate tetrahedral FE meshes for electroencephalographic source localization. We present recursive solid angle labeling for the surface segmentation of the model and then augment it with a set of smoothing, inflation, and optimization routines to further enhance the quality of the FE mesh. The results show that our methodology can produce FE meshes with sub-millimeter accuracy, significant with respect to both the 3D structure discretization outcome and the electroencephalographic source localization estimates. Such meshes can be achieved for the human head including complex deep brain structures. Our algorithm has been implemented in the open Matlab-based Zeffiro Interface toolbox, utilizing its effective parallel computing system.
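    The solid angle criterion that such labeling builds on can be illustrated with the Van Oosterom-Strackee formula for the solid angle of a triangle at a point: over a consistently oriented closed surface, the signed angles sum to about 4π at an interior point and about 0 outside. This is only the elementary building block, not the paper's recursive labeling scheme:

```python
import numpy as np

def triangle_solid_angle(p, a, b, c):
    """Signed solid angle subtended by triangle (a, b, c) at point p,
    via the Van Oosterom-Strackee formula."""
    r1, r2, r3 = a - p, b - p, c - p
    l1, l2, l3 = np.linalg.norm(r1), np.linalg.norm(r2), np.linalg.norm(r3)
    num = np.dot(r1, np.cross(r2, r3))
    den = (l1 * l2 * l3 + np.dot(r1, r2) * l3
           + np.dot(r1, r3) * l2 + np.dot(r2, r3) * l1)
    return 2.0 * np.arctan2(num, den)
```

    Seen from the centroid of a regular tetrahedron, for example, the four faces partition the full sphere, so their solid angle magnitudes sum to 4π; a point outside the surface yields a signed total of zero.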

    A mathematical model and iterative inversion for fluorescent optical projection tomography

    Solving for the fluorophore distribution in a tomographic setting has been difficult because of the lack of physically meaningful and computationally applicable propagation models. This study concentrates on the direct modelling of fluorescence signals in optical projection tomography (OPT) and on the corresponding inverse problem. The reconstruction problem is solved using emission projections corresponding to a series of rotational imaging positions of the sample. Just as bright-field OPT bears resemblance to transmission X-ray computed tomography, fluorescent-mode OPT is analogous to X-ray fluorescence computed tomography (XFCT). As an improved direct model for fluorescent OPT, we derive a weighted Radon transform based on the XFCT literature. Moreover, we propose a simple and fast iteration scheme for the slice-wise reconstruction of the sample. The developed methods are applied in both numerical experiments and in the inversion of fluorescent OPT data from a zebrafish embryo. The results demonstrate the importance of propagation modelling, and our analysis provides a flexible modelling framework for fluorescent OPT that can easily be modified to adapt to different imaging setups.
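    The role of the attenuation weight can be illustrated with a toy single-angle discretization: each pixel's fluorescent emission is damped by the attenuation accumulated along its straight exit path to the detector. This is a simplified sketch of the weighting idea, not the paper's rotational weighted Radon model:

```python
import numpy as np

def weighted_projection(f, mu, dx=1.0):
    """One parallel-beam projection of a fluorophore map f (detector at
    the right edge): pixel (i, j) contributes f[i, j] weighted by
    exp(-sum of mu over the pixels between it and the edge)."""
    # cumulative attenuation to the right edge, excluding the emitter itself
    path = np.cumsum(mu[:, ::-1], axis=1)[:, ::-1] - mu
    return (f * np.exp(-path * dx)).sum(axis=1) * dx
```

    With mu = 0 this reduces to the plain row sums of an unweighted Radon projection; nonzero attenuation between the emitter and the detector damps the measured signal exponentially, which is why ignoring the weight biases the reconstruction.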